Facebook’s controversial study that manipulated users’ newsfeeds was not pre-approved by Cornell University’s ethics board, and Facebook may not have had “implied” user permission to conduct the study as researchers previously claimed.

In the study, researchers at Facebook tweaked what nearly 700,000 users saw in their news feeds, skewing the content to be more positive or negative than normal in an attempt to manipulate their moods. Then they checked those users’ status updates to see whether the content affected what they wrote. They found that, yes, Facebook users’ moods are affected by what they see in their news feeds: users who saw more negative posts wrote more negative things on their own walls, and likewise for positive posts.

(For a refresher on the controversy, check out The Washington Post’s story from Monday.)

Ethics board consulted after the fact

As reported by The Post and other news outlets, Princeton University psychology professor Susan Fiske told The Atlantic that an independent ethics committee, Cornell University’s Institutional Review Board (IRB), had approved the use of Facebook’s “pre-existing data set” in the experiment. Fiske edited the study, which was published in the June 17 issue of Proceedings of the National Academy of Sciences.

A statement issued Monday by Cornell University clarified that the experiment was conducted before the IRB was consulted. A Cornell professor, Jeffrey Hancock, and doctoral student Jamie Guillory worked with Facebook on the study, but the university made a point of distancing itself from the research. Its statement said:

Professor Hancock and Dr. Guillory did not participate in data collection and did not have access to user data. Their work was limited to initial discussions, analyzing the research results and working with colleagues from Facebook to prepare the peer-reviewed paper “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks,” published online June 2 in Proceedings of the National Academy of Science-Social Science.

Because the research was conducted independently by Facebook and Professor Hancock had access only to results – and not to any data at any time – Cornell University’s Institutional Review Board concluded that he was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required.

User consent called into question

Facebook researchers claimed that the fine print users agreed to when they signed up was tantamount to “informed consent” to participate in the study. Facebook’s current data use policy says user information can be used for “internal operations,” including “research.” However, the policy said nothing of the sort in January 2012, when the study was conducted. According to Forbes:

In January 2012, the policy did not say anything about users potentially being guinea pigs made to have a crappy day for science, nor that ‘research’ is something that might happen on the platform.

Four months after the study, in May 2012, Facebook made changes to its data use policy, and that’s when it introduced this line about how it might use your information: ‘For internal operations, including troubleshooting, data analysis, testing, research and service improvement.’ Facebook helpfully posted a ‘red-line’ version of the new policy, contrasting it with the prior version from September 2011 — which did not mention anything about user information being used in ‘research.’

“When someone signs up for Facebook, we’ve always asked permission to use their information to provide and enhance the services we offer,” a Facebook spokesman told Forbes. “To suggest we conducted any corporate research without permission is complete fiction. Companies that want to improve their services use the information their customers provide, whether or not their privacy policy uses the word ‘research’ or not.”

This revelation will likely further rile critics already angered that Facebook fell short of the standards for informed consent that the government and professional associations impose on studies conducted on humans. Informed consent involves disclosing information about a study before it takes place and giving subjects a chance to opt out; Facebook did neither. Because Facebook is a private company, it isn’t held to those standards, according to legal experts interviewed by the International Business Times, but that hasn’t stopped users from feeling violated and angry.

If international headlines are an accurate gauge of public opinion, people worldwide are angry at Facebook. Here’s a sampling, with English translations:

Germany: Manipulierte Newsfeeds: Facebook, das permanente Psycho-Experiment

Manipulated news feeds: Facebook, the permanent psychological experiment

Czech Republic: Má nás Facebook za laboratorní krysy? Experiment vyvolal bouřlivé reakce

Does Facebook take us for lab rats? The experiment provoked a storm of reactions

Brazil: Opinião: O grande problema do Facebook? Cegueira ética

Opinion: Facebook’s big problem? Ethical blindness

Norway: Facebook svarer på massiv kritikk

Facebook responds to massive criticism

Israel: פייסבוק ביצעה ניסוי ברגשות של משתמשים, ועכשיו הם כועסים

Facebook ran an experiment on users’ emotions, and now they’re angry

Hungary: Kísérleti patkányokként kezeli a Facebook felhasználóit? Háborognak az emberek

Does Facebook treat its users like lab rats? People are outraged

Italy: Facebook “ammette” di manipolare l’umore degli utenti

Facebook “admits” to manipulating users’ moods

India: आपके इमोशंस से खेल रहा था फेसबुक, न्यूजफीड से की छेड़छाड़!

Facebook was playing with your emotions, tampering with your news feed!

China: 你的喜怒哀樂被臉書控制了嗎

Are your emotions being controlled by Facebook?

France: Des utilisateurs de Facebook « manipulés » pour une expérience psychologique

Facebook users “manipulated” for a psychological experiment